Attraction Radii in Binary Hopfield Nets Are Hard to Compute
Authors
Abstract
We prove that it is an NP-hard problem to determine the attraction radius of a stable vector in a binary Hopfield memory network, and even that the attraction radius is hard to approximate. Under synchronous updating, the problems are already NP-hard for two-step attraction radii; direct (one-step) attraction radii can be computed in polynomial time. A Hopfield memory network [6] consists of n binary-valued nodes, or "neurons." We index the nodes by {1, ..., n} and choose {−1, +1} as their possible states (the values {0, 1} could be chosen equally well). Associated with each pair of nodes i, j is an interconnection weight w_ij. The interconnections are symmetric, so that w_ij = w_ji for each i, j; moreover, w_ii = 0 for each i. In addition, each node i has an internal threshold value t_i. We denote the matrix of interconnection weights by W = (w_ij) and the vector of threshold values by t = (t_1, t_2, ..., t_n). At any given moment, each node i in the network has a state x_i, which is either −1 or +1. The state at the next moment is determined as a function of the states of the other nodes. (Work supported by the Academy of Finland.)
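The abstract above breaks off before stating the update rule. As a rough sketch only, assuming the standard binary Hopfield dynamics x_i := sgn(sum_j w_ij x_j − t_i) and the usual definition of the attraction radius (the largest Hamming distance r such that every vector within distance r of the stable vector is attracted back to it), the following Python fragment illustrates the quantities the hardness results are about. The function names, the tie-breaking at zero input, and the brute-force enumeration are our own illustrative choices, not the paper's construction.

    # Illustrative sketch, not taken from the paper: standard synchronous
    # binary Hopfield dynamics over states in {-1, +1}, plus a brute-force
    # check of the attraction radius of a stable vector x_star.
    import itertools
    import numpy as np

    def update(W, t, x):
        """One synchronous step: x_i <- sign(sum_j w_ij * x_j - t_i)."""
        h = W @ x - t
        # Tie-breaking at h == 0 is a modelling choice; we map it to +1 here.
        return np.where(h >= 0, 1, -1)

    def attraction_radius(W, t, x_star, max_steps=100):
        """Largest r such that every vector within Hamming distance r of the
        stable vector x_star converges back to x_star under synchronous
        updating (giving up after max_steps if no fixed point is reached)."""
        n = len(x_star)
        for r in range(1, n + 1):
            # Try every vector at Hamming distance exactly r from x_star.
            for flips in itertools.combinations(range(n), r):
                y = x_star.copy()
                y[list(flips)] *= -1
                for _ in range(max_steps):
                    y_next = update(W, t, y)
                    if np.array_equal(y_next, y):
                        break
                    y = y_next
                if not np.array_equal(y, x_star):
                    return r - 1
        return n

This check enumerates exponentially many perturbed vectors; the point of the paper is that, unless P = NP, no polynomial-time procedure computes (or even closely approximates) this radius, although the direct (one-step) radius can be computed in polynomial time.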
Similar Resources
Attraction Radii in Binary Hopfield Nets are Hard to Compute
We prove that it is an NP-hard problem to determine the attraction radius of a stable vector in a binary Hopfield memory network, and even that the attraction radius is hard to approximate. Under synchronous updating, the problems are already NP-hard for two-step attraction radii; direct (one-step) attraction radii can be computed in polynomial time. A Hopfield memory network [6] consists of n bin...
A Continuous-time Hopfield Net Simulation of Discrete Neural Networks
We investigate the computational power of continuous-time symmetric Hopfield nets. Since the dynamics of such networks are governed by Liapunov (energy) functions, they cannot generate infinite nondamping oscillations, and hence cannot simulate arbitrary (potentially divergent) discrete computations. Nevertheless, we prove that any convergent fully parallel computation by a network of n discrete...
A Continuous-time Hopfield Net Simulation of Discrete Neural Networks (Institute of Computer Science, Academy of Sciences of the Czech Republic)
We investigate the computational power of continuous-time symmetric Hopfield nets. Since the dynamics of such networks are governed by Liapunov (energy) functions, they cannot generate infinite nondamping oscillations, and hence cannot simulate arbitrary (potentially divergent) discrete computations. Nevertheless, we prove that any convergent fully parallel computation by a network of n discrete...
The Computational Power of Discrete Hopfield Nets with Hidden Units
We prove that polynomial size discrete Hopfield networks with hidden units compute exactly the class of Boolean functions PSPACE/poly, i.e., the same functions as are computed by polynomial space-bounded nonuniform Turing machines. As a corollary to the construction, we observe also that networks with polynomially bounded interconnection weights compute exactly the class of functions P/poly, i.e...
Some Afterthoughts on Hopfield Networks
The present paper investigates four relatively independent issues, each in one section, which complete our knowledge regarding the computational aspects of popular Hopfield nets [9]. In Section 2, the computational equivalence of convergent asymmetric and Hopfield nets is shown with respect to the network size. In Section 3, the convergence time of Hopfield nets is analyzed in terms of bit represen...